Heart-stopping moment Tesla owner nearly plows into a moving TRAIN in 'self-drive' mode (and he says it wasn't the first time!)

Daily Mail - Science & tech

A Tesla owner is blaming his vehicle's Full Self-Driving feature for veering toward an oncoming train before he could intervene. Craig Doty II, from Ohio, was driving down a road at night earlier this month when dashcam footage showed his Tesla rapidly approaching a passing train with no sign of slowing. He claimed the vehicle was in Full Self-Driving (FSD) mode at the time and did not brake despite the train crossing the road, though he did not specify the make or model of the car. In the video, the driver appears to have been forced to intervene, veering right through the railway crossing sign and coming to a stop mere feet from the moving train. Tesla has faced numerous lawsuits from owners who claimed the FSD or Autopilot feature caused them to crash because it failed to stop for another vehicle or swerved into an object, in some cases claiming the lives of the drivers involved.


Tesla Autopilot feature was involved in 13 fatal crashes, US regulator says

The Guardian

US auto-safety regulators said on Friday that their investigation into Tesla's Autopilot had identified at least 13 fatal crashes in which the feature had been involved. The investigation also found the electric carmaker's claims did not match up with reality. The National Highway Traffic Safety Administration (NHTSA) disclosed on Friday that during its three-year Autopilot safety investigation, which it launched in August 2021, it identified at least 13 Tesla crashes involving one or more deaths, and many more involving serious injuries, in which "foreseeable driver misuse of the system played an apparent role". It also found evidence that "Tesla's weak driver engagement system was not appropriate for Autopilot's permissive operating capabilities", which resulted in a "critical safety gap". The NHTSA also raised concerns that Tesla's Autopilot name "may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation".


Tesla prevails in US lawsuit alleging autopilot was at fault in fatal crash

The Guardian

Tesla on Tuesday won the first US trial over allegations that its Autopilot driver-assistance feature led to a death, a major victory for the automaker as it faces several similar lawsuits across the country. The case, in a California state court, was filed by two passengers in a 2019 crash who accused the company of knowing the Autopilot feature was defective when it sold the car. Tesla argued that human error caused the crash. The 12-member jury announced on Tuesday that it had found the vehicle did not have a manufacturing defect. The verdict came on the fourth day of deliberations, and the vote was 9-3.


US jury hands Tesla sweeping win over Autopilot feature

Al Jazeera

A California state court jury has handed Tesla Inc a sweeping win, finding that the carmaker's Autopilot feature did not fail to perform safely in what appears to be the first trial related to a crash involving the partially automated driving software. The verdict could be an important victory for Tesla as it tests and rolls out its Autopilot and more advanced "Full Self-Driving (FSD)" system, which Chief Executive Elon Musk has touted as crucial to his company's future, but which has drawn regulatory and legal scrutiny. Justine Hsu, a resident of Los Angeles, sued the electric vehicle maker in 2020, saying her Tesla Model S swerved into a curb while it was on Autopilot and then an airbag was deployed "so violently it fractured Plaintiff's jaw, knocked out teeth, and caused nerve damage to her face". She alleged there were defects in the design of Autopilot and the airbag, and sought more than $3m in damages for the alleged defects and other claims. Tesla denied liability for the 2019 accident.


US Expands Safety Probe Into Tesla Autopilot

International Business Times

US regulators expanded a probe into Tesla's "Autopilot" system, moving the investigation closer to a potential recall of a controversial feature in Elon Musk's electric vehicles. The National Highway Traffic Safety Administration is investigating whether "Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver's supervision," according to a summary statement. The agency now considers the probe an "engineering analysis" -- which in NHTSA parlance upgrades the status from a "preliminary evaluation" -- to determine "whether a safety recall should be initiated or the investigation should be closed." Tesla did not immediately respond to a request for comment. NHTSA opened the probe in August 2021 after identifying 11 crashes involving a first responder vehicle and a Tesla in which Autopilot or Traffic Aware Cruise Control was engaged, and five additional cases were later found that fit into this group.


Tesla Owner Claims Car Was Acting Like A 'Drunk Driver' In Autopilot: Here's What Happened

International Business Times

An owner of a Tesla (TSLA) Model 3 has claimed that the electric car performed so poorly using the Autopilot feature that it acted like a "drunk novice driver," prompting the owner to sue the automaker to force a buyback of the EV. A judge at the Darmstadt Regional Court in Germany ordered Tesla to buy back the Model 3 for $76,000 from the unhappy vehicle owner, according to Electrek. Tesla has appealed the court's decision, contending that the Autopilot problems could have been resolved with a free software upgrade, German news outlet Spiegel reported. The complaint, according to Spiegel, alleged that the owner of the Model 3 paid nearly $7,000 for the Autopilot function, which they claimed "did not work." The plaintiff said that "The steering behavior at entrances and exits or motorway junctions is spongy and resembles that of a 'drunk novice driver,'" adding that the car did not recognize traffic lights and stop signs, according to Spiegel.


In 2021, Tesla's phenomenal profits were offset by constant crisis

Engadget

The close of 2021 finds Tesla wealthier than ever -- and, in CEO Elon Musk's case, wealthier than everybody else. The electric vehicle manufacturer notched records for both deliveries and profits this year despite a global chip shortage that decimated supply chains worldwide, effectively kneecapping the rest of the automotive industry's production capacity. However, its financial successes were often overshadowed by continuing production quality issues, multiple NHTSA and SEC investigations, high-profile failures of its vaunted "Full Self Driving" system, and numerous vehicle recalls and delays for upcoming models. And with existing industry stalwarts like Ford, GM, Honda and the Volkswagen Group making concerted efforts to electrify their own offerings, could 2022 be the year that Tesla's reign as top EV automaker finally ends? The company entered this year having met its 2020 goal of producing a half-million vehicles (of which it delivered 499,550 to customers), a nearly 133,000-unit increase over 2019.


Deep Learning in the Cloud

#artificialintelligence

As massive amounts of data are stored every second, opportunities arise to build meaningful, even transformative models. This data comes in several forms, including text, images and videos, all of which can feed advanced models built with techniques such as Deep Learning. Drawing on these extensive datasets, applications using technologies such as computer vision now power products like self-driving cars and facial recognition in phones. When creating a Deep Learning application, one of the first decisions to be made is where the model will be trained: locally on a machine or through a third-party cloud provider. This decision matters because it can significantly affect a model's training time.
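The local-versus-cloud trade-off the article describes often comes down to measured training time on the hardware you already have. As a purely illustrative sketch (not from the article), the toy pure-Python linear-regression loop below shows the kind of small, representative benchmark one might time locally before deciding whether cloud hardware is worth paying for; all names and parameters here are hypothetical.

```python
import time

def train_toy_model(n_samples=1000, epochs=50, lr=0.01):
    """Toy linear-regression training loop (pure Python, no frameworks).

    A stand-in for the decision the article raises: time a small
    representative job on local hardware, then compare against the
    same job on a cloud instance before committing to a provider.
    """
    # Synthetic data: y = 2x + 1, noise-free for simplicity.
    xs = [i / n_samples for i in range(n_samples)]
    ys = [2 * x + 1 for x in xs]

    w, b = 0.0, 0.0
    start = time.perf_counter()
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        # Accumulate mean-squared-error gradients over the dataset.
        for x, y in zip(xs, ys):
            err = (w * x + b) - y
            grad_w += 2 * err * x / n_samples
            grad_b += 2 * err / n_samples
        w -= lr * grad_w
        b -= lr * grad_b
    elapsed = time.perf_counter() - start
    return w, b, elapsed

w, b, elapsed = train_toy_model()
print(f"learned w={w:.2f}, b={b:.2f} in {elapsed:.4f}s")
```

Scaling `n_samples` and `epochs` up toward realistic workloads makes the elapsed time a rough proxy for whether local hardware is adequate; in practice one would run the same comparison with the actual framework and model.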


The Most Disturbing Part of the Latest Tesla Crash

Slate

Two men died near Houston, Texas, on Saturday while riding in a 2019 Tesla Model S that, according to local authorities, was speeding into a turn and ended up going off the road and crashing into a tree. It took first responders four hours and more than 30,000 gallons of water to put out the resulting fire, which kept reigniting; when damaged, the lithium-ion batteries in electric cars can cause fires that are very difficult to extinguish because of how they store energy. Authorities reportedly attempted to ask Tesla for advice on how to put out the fire, but it's unclear whether they ended up getting any help. Besides the fire, there was something especially disturbing about the crash: no one was in the driver's seat. One of the men was in the passenger seat and the other in the rear.


Tesla settles with ex-employee over Autopilot code theft accusations

Engadget

Tesla has settled with a former employee whom it sued for downloading data related to its Autopilot feature, Reuters has reported. Tesla filed the lawsuit against Cao Guangzhi back in 2019, accusing its former engineer of copying data to an iCloud account and taking it to his new employer, China's XMotors (owned by Xpeng). Cao reportedly made a monetary payment to Tesla as part of the terms of the settlement, but the amount and other details were not disclosed. Cao's legal representative confirmed the settlement, saying Cao never provided Tesla information to XMotors or any other company. XMotors was not a party in the case, and said it developed its own self-driving technology in-house and respected intellectual property rights.